Dual-Attention-Driven Multiscale Fusion Object Searching Network for Remote Sensing Imagery

Authors

Abstract

Object search is a challenging yet important task. Although many efforts have been made to address this problem and great progress has been achieved for natural images, searching for all objects of specified types in remote sensing images has barely been studied. In this work, we are interested in object search in remote sensing images. Compared with person search in natural scenes, this task is more difficult for two reasons: one is that a remote sensing image usually contains a large number of objects, which makes it challenging to characterize the object features; the other is that objects in remote sensing images are densely distributed, which easily yields erroneous localization. To address these issues, we propose a new end-to-end deep learning framework. First, a multiscale feature aggregation (MSFA) module strengthens the representation of low-level features by fusing multilayer features; the fused features, with richer details, significantly improve the accuracy of the search. Second, a dual-attention object enhancement (DAOE) module enhances object features in both the channel and spatial dimensions; the enhanced features improve the localization of dense objects. Finally, we built datasets based on remote sensing images that exhibit complex changes in space and time. Extensive experiments and comparisons demonstrate the state-of-the-art performance of our method on these datasets.
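
To make the two modules described above concrete, the following is a minimal PyTorch sketch of how an MSFA block and a DAOE block could be wired together. The channel sizes, the element-wise fusion rule, and the attention design are illustrative assumptions, not the paper's exact architecture.

```python
# Minimal PyTorch sketch of the two modules described in the abstract.
# Layer sizes, fusion strategy, and attention design are assumptions for
# illustration; the paper's exact architecture may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MSFA(nn.Module):
    """Multiscale feature aggregation: fuse a deep, low-resolution feature
    map into a shallow, high-resolution one to enrich low-level details."""

    def __init__(self, shallow_ch, deep_ch, out_ch):
        super().__init__()
        self.reduce = nn.Conv2d(deep_ch, shallow_ch, kernel_size=1)
        self.fuse = nn.Conv2d(shallow_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, shallow, deep):
        deep = self.reduce(deep)
        deep = F.interpolate(deep, size=shallow.shape[-2:], mode="bilinear",
                             align_corners=False)
        return self.fuse(shallow + deep)          # element-wise fusion


class DAOE(nn.Module):
    """Dual-attention object enhancement: re-weight features along the
    channel dimension and then the spatial dimension."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.channel_att = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1), nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1), nn.Sigmoid())
        self.spatial_att = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3), nn.Sigmoid())

    def forward(self, x):
        x = x * self.channel_att(x)               # channel re-weighting
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        return x * self.spatial_att(torch.cat([avg, mx], dim=1))


# Usage: fuse a stride-4 and a stride-16 feature map, then enhance it.
shallow = torch.randn(1, 256, 64, 64)
deep = torch.randn(1, 1024, 16, 16)
feat = DAOE(256)(MSFA(256, 1024, 256)(shallow, deep))
print(feat.shape)  # torch.Size([1, 256, 64, 64])
```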

Similar papers

Object-oriented change detection approach for high-resolution remote sensing images based on multiscale fusion

Aiming at the difficulties in change detection caused by the complexity of high-resolution remote sensing images that cover varied ecological environments and artificial objects, and in order to overcome the limitations of traditional pixel-oriented change detection methods and improve detection precision, an innovative object-oriented change detection approach based on multiscale fusion is p...

Object-based Image Fusion Method Based on Wavelet and Pca for Remote Sensing Imagery

In this paper, a new object-based wavelet fusion technique is presented for the fusion of multispectral (MS) and panchromatic (PAN) images to improve spatial information and preserve spectral information. The basic idea is to build a segmented label image using the statistical region merging and minimum heterogeneity rule (SRMMHR) segmentation method to guide the object-based image fusion (OBIF). Ther...
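
As a rough illustration of the object-guided fusion idea, the sketch below injects PAN wavelet details into an MS band and blends the result per segment. The SRMMHR segmentation is replaced here by an assumed precomputed label map, and the per-object weighting rule is hypothetical.

```python
# Minimal sketch of object-guided wavelet fusion, assuming a precomputed
# segmentation label map stands in for the SRMMHR step described above.
# Function names and the per-object weighting rule are illustrative only.
import numpy as np
import pywt


def object_wavelet_fusion(ms_band, pan, labels, level=2):
    """Inject PAN wavelet details into one MS band, weighted per object."""
    ms_coeffs = pywt.wavedec2(ms_band, "db4", level=level)
    pan_coeffs = pywt.wavedec2(pan, "db4", level=level)

    fused = [ms_coeffs[0]]                        # keep MS approximation (spectral)
    for ms_det, pan_det in zip(ms_coeffs[1:], pan_coeffs[1:]):
        fused.append(tuple(
            m + p for m, p in zip(ms_det, pan_det)))  # add PAN details (spatial)
    out = pywt.waverec2(fused, "db4")[: ms_band.shape[0], : ms_band.shape[1]]

    # Per-object blending: objects with higher local PAN variance keep more detail.
    result = ms_band.astype(float)
    for lab in np.unique(labels):
        mask = labels == lab
        w = np.clip(pan[mask].std() / (pan.std() + 1e-6), 0.0, 1.0)
        result[mask] = (1 - w) * ms_band[mask] + w * out[mask]
    return result


# Toy usage with random data and a two-object label map.
pan = np.random.rand(128, 128)
ms = np.random.rand(128, 128)
labels = (np.arange(128)[:, None] < 64).astype(int) * np.ones((128, 128), int)
print(object_wavelet_fusion(ms, pan, labels).shape)  # (128, 128)
```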

Object-oriented subspace analysis for airborne hyperspectral remote sensing imagery

An object-oriented mapping approach based on subspace analysis of airborne hyperspectral images was investigated in this paper. Hyperspectral features were extracted with subspace learning approaches in order to reduce the redundancy of the spectral space and extract characteristic images for the subsequent object-oriented classification. In this paper, three kinds of spectral feature extract...
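
A small sketch of the subspace idea follows: each hyperspectral pixel is projected onto a low-dimensional spectral subspace before classification. PCA is used here purely as a stand-in for the subspace learning methods the paper investigates, and the dimensions are arbitrary.

```python
# Project each hyperspectral pixel onto a low-dimensional spectral subspace
# (PCA here, as an assumed stand-in) before object-oriented classification.
import numpy as np


def pca_subspace(cube, n_components=10):
    """cube: (rows, cols, bands) hyperspectral image -> (rows, cols, k) features."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    X -= X.mean(axis=0)                           # center the spectra
    cov = X.T @ X / (X.shape[0] - 1)              # band covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return (X @ top).reshape(rows, cols, n_components)


cube = np.random.rand(64, 64, 200)                # toy 200-band image
features = pca_subspace(cube, n_components=10)
print(features.shape)                             # (64, 64, 10)
```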

Remote Sensing Image Fusion Based on Two-Stream Fusion Network

Remote sensing image fusion (or pan-sharpening) aims at generating a high-resolution multi-spectral (MS) image from a high-spatial-resolution single-band panchromatic (PAN) image and a low-spatial-resolution multi-spectral image. In this paper, a deep convolutional neural network with two-stream inputs, one for the PAN image and one for the MS image, is proposed for remote sensing image pan-sharpenin...
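
The sketch below shows one plausible shape of such a two-stream network: a PAN stream, an MS stream on the upsampled multispectral input, and a fusion head with a residual connection. Depths and channel counts are illustrative assumptions, not the paper's configuration.

```python
# Minimal PyTorch sketch of a two-stream pan-sharpening network: one stream
# for the PAN image, one for the upsampled MS image, fused by concatenation.
# Channel counts and depths are illustrative, not the paper's configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoStreamFusionNet(nn.Module):
    def __init__(self, ms_bands=4):
        super().__init__()
        self.pan_stream = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True))
        self.ms_stream = nn.Sequential(
            nn.Conv2d(ms_bands, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True))
        self.fusion = nn.Sequential(
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, ms_bands, 3, padding=1))    # high-resolution MS output

    def forward(self, pan, ms):
        ms_up = F.interpolate(ms, size=pan.shape[-2:], mode="bicubic",
                              align_corners=False)
        fused = torch.cat([self.pan_stream(pan), self.ms_stream(ms_up)], dim=1)
        return ms_up + self.fusion(fused)             # residual: refine upsampled MS


pan = torch.randn(1, 1, 256, 256)                     # high-resolution panchromatic
ms = torch.randn(1, 4, 64, 64)                        # low-resolution multispectral
print(TwoStreamFusionNet()(pan, ms).shape)            # torch.Size([1, 4, 256, 256])
```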

Object Model and Knowledge Database for Automated Object-based Analysis of Remote Sensing Imagery

Challenges connected with current remote sensing imagery include the complex information content provided by high-resolution sensor systems as well as a corresponding amount of digital data that requires efficient handling. In order to cope with these challenges, research has been carried out to develop novel methods for the treatment of this kind of data and imagery. A set of methods comprised unde...

Journal

Journal title: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing

Year: 2022

ISSN: 2151-1535, 1939-1404

DOI: https://doi.org/10.1109/jstars.2022.3207302